Doubly Sparsifying Network

Authors

  • Zhangyang Wang
  • Shuai Huang
  • Jiayu Zhou
  • Thomas S. Huang
Abstract

We propose the doubly sparsifying network (DSN), drawing inspiration from the double sparsity model for dictionary learning. DSN emphasizes the joint exploitation of both the problem structure and the parameter structure: it simultaneously sparsifies the output features and the learned model parameters under one unified framework. DSN enjoys intuitive model interpretation, compact model size, and low complexity. We compare DSN against several carefully designed baselines and verify its consistently superior performance across a wide range of settings. Encouraged by its robustness to insufficient training data, we explore the applicability of DSN to brain signal processing, a challenging interdisciplinary area. DSN is evaluated on two mainstream tasks: electroencephalographic (EEG) signal classification and blood oxygenation level dependent (BOLD) response prediction, and achieves promising results in both cases.
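The two kinds of sparsity the abstract describes can be illustrated with a minimal sketch: soft-thresholding sparsifies the output features, while magnitude-based hard thresholding sparsifies the layer's parameters. This is an illustration of the general idea, not the authors' exact DSN formulation; the layer sizes, thresholds, and helper names below are assumptions for the example.

```python
import numpy as np

def soft_threshold(x, theta):
    # Proximal operator of the l1 norm: zeroes small entries,
    # shrinks the rest toward zero -> sparse output features.
    return np.sign(x) * np.maximum(np.abs(x) - theta, 0.0)

def hard_threshold(W, k):
    # Keep only the k largest-magnitude entries per row,
    # zeroing the rest -> sparse model parameters.
    W = W.copy()
    for row in W:
        small = np.argsort(np.abs(row))[:-k]
        row[small] = 0.0
    return W

rng = np.random.default_rng(0)
W = hard_threshold(rng.standard_normal((8, 16)), k=4)  # sparse weights
x = rng.standard_normal(16)
z = soft_threshold(W @ x, theta=0.5)                   # sparse features
```

A doubly sparse layer of this kind is cheap to store (few nonzero weights) and to apply (few nonzero activations), which matches the compact-model and low-complexity claims.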


Similar References

Combating Adversarial Attacks Using Sparse Representations

It is by now well-known that small adversarial perturbations can induce classification errors in deep neural networks (DNNs). In this paper, we make the case that sparse representations of the input data are a crucial tool for combating such attacks. For linear classifiers, we show that a sparsifying front end is provably effective against ℓ∞-bounded attacks, reducing output distortion due to t...
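The sparsifying front end described above can be sketched as follows: project the input onto a basis, keep only the largest-magnitude coefficients, and reconstruct before feeding the classifier. Small dense (ℓ∞-bounded) perturbations are largely discarded with the dropped coefficients. This is a hedged illustration of the general mechanism, not the paper's exact construction; the function name and parameters are assumptions.

```python
import numpy as np

def sparsify_front_end(x, basis, k):
    # Analyze x in an orthonormal basis, keep the k largest-magnitude
    # coefficients, and reconstruct; the rest of the signal (including
    # most of a small dense perturbation) is discarded.
    coeffs = basis.T @ x
    keep = np.argsort(np.abs(coeffs))[-k:]
    sparse = np.zeros_like(coeffs)
    sparse[keep] = coeffs[keep]
    return basis @ sparse

# With the identity basis, only the two dominant entries survive:
out = sparsify_front_end(np.array([3.0, 0.1, -0.1, 2.0]), np.eye(4), k=2)
```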


Re-Weighted Learning for Sparsifying Deep Neural Networks

This paper addresses the topic of sparsifying deep neural networks (DNNs). While DNNs are powerful models that achieve state-of-the-art performance on a large number of tasks, their large number of parameters poses serious storage and computational challenges. To combat these difficulties, a growing line of work focuses on pruning network weights without sacrificing performance. We propos...
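Re-weighted schemes of this flavor typically penalize each parameter in inverse proportion to its current magnitude, so small weights are pushed toward zero harder than large ones. A minimal sketch of such a penalty (in the spirit of classical re-weighted ℓ1 minimization; this is an assumption about the general technique, not this paper's specific algorithm):

```python
import numpy as np

def reweighted_l1_penalty(w, eps=1e-3):
    # Each weight's penalty coefficient is 1 / (|w_i| + eps):
    # near-zero parameters are penalized strongly (driven to zero),
    # large parameters are penalized weakly (kept for performance).
    coeffs = 1.0 / (np.abs(w) + eps)
    return float(np.sum(coeffs * np.abs(w)))
```

Iterating training against this penalty, with the coefficients refreshed from the current weights, tends to produce sparser solutions than a plain ℓ1 penalty.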


Reduced order model for doubly output induction generator in wind park using integral manifold theory

A dynamic reduced order model using integral manifold theory has been derived, which can be used to simulate the DOIG wind turbine using a double-winding representation of the generator rotor. The model is suitable for use in transient stability programs that can be used to investigate large power systems. The behavior of a wind farm and the network under various system disturbances was stu...


Learning Filter Bank Sparsifying Transforms

Data is said to follow the transform (or analysis) sparsity model if it becomes sparse when acted on by a linear operator called a sparsifying transform. Several algorithms have been designed to learn such a transform directly from data, and data-adaptive sparsifying transforms have demonstrated excellent performance in signal restoration tasks. Sparsifying transforms are typically learned usin...
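The transform (analysis) model above says z = Wx is sparse, in contrast to the synthesis model x = Dz. Learning W from data is commonly done by alternating a thresholding step with a transform update. The sketch below is a deliberately simplified illustration under that assumption: the real algorithms in this literature constrain or regularize the W-update (e.g., to keep it well-conditioned), which the plain least-squares step here omits.

```python
import numpy as np

def top_k(Z, k):
    # Keep the k largest-magnitude entries of each column of Z.
    out = np.zeros_like(Z)
    idx = np.argsort(np.abs(Z), axis=0)[-k:]
    np.put_along_axis(out, idx, np.take_along_axis(Z, idx, axis=0), axis=0)
    return out

def learn_transform(X, k, n_iter=10):
    # Alternating minimization of ||W X - Z||_F^2:
    #   sparse-coding step: Z = top_k(W X)  (exact, a projection)
    #   transform update:   W = Z X^+      (simplified; unconstrained)
    W = np.eye(X.shape[0])
    for _ in range(n_iter):
        Z = top_k(W @ X, k)
        W = Z @ np.linalg.pinv(X)
    return W
```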


Online Learning Sensing Matrix and Sparsifying Dictionary Simultaneously for Compressive Sensing

This paper considers simultaneously optimizing the Sensing Matrix and Sparsifying Dictionary (SMSD) on a large training dataset. We propose an online algorithm that consists of a closed-form solution for optimizing the sensing matrix with a fixed sparsifying dictionary and a stochastic method for optimizing the sparsifying dictionary on a large training dataset when the sensing matrix is fixed....
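The compressive-sensing setup that the sensing matrix and sparsifying dictionary jointly serve can be sketched as follows. This shows the standard measurement model only, not the paper's online SMSD algorithm; all dimensions and the random choices are assumptions for the example.

```python
import numpy as np

rng = np.random.default_rng(1)
n, m, p = 64, 16, 128              # signal dim, measurements, dictionary atoms
D = rng.standard_normal((n, p))    # sparsifying dictionary: x = D @ alpha
Phi = rng.standard_normal((m, n))  # sensing matrix: y = Phi @ x, m << n

alpha = np.zeros(p)                # sparse code with 5 active atoms
alpha[rng.choice(p, 5, replace=False)] = 1.0
x = D @ alpha                      # signal, sparse in the dictionary D
y = Phi @ x                        # compressive measurements

# Joint SMSD optimization tunes Phi and D together so that the effective
# matrix Phi @ D is well-suited for recovering alpha from y.
```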




Publication date: 2017